In the mid-1990s, Jeremy Dalton experienced a new reality when he stepped inside an arcade game for the very first time.
Pulling a heavy headset with “an octopus of thick cabling” down over his young eyes, Dalton grabbed a controller and started fighting robots. At the time, the game and the immersive world it opened up were exciting, yet somewhat fleeting, Dalton said. But nearly 30 years later, that first experience with extended reality, like the bulky headset he wore, has proven difficult to shake off.
Now head of metaverse technologies at PwC UK in London and the author of Reality Check: How Immersive Technologies Can Transform Your Business, Dalton is completely grounded within an extended reality, one that’s evolved out of the arcades of his childhood and into the workplace. From designing and building virtual worlds to helping organizations train and work together more effectively using VR headsets — he sends out hundreds each month — Dalton is working to make extended reality, and the metaverse, our collective reality.
“The ability to immerse yourself in completely different worlds, just by putting on a device or stepping into a certain area — I think that’s an incredibly powerful concept, and a very magical one that has applications in both the consumer world and the business world,” Dalton told Built In. “And I think that’s very rare for a technology to be able to do.”
Extended reality, or XR, is an ecosystem of immersive technologies — virtual reality, mixed reality and augmented reality. It dates back to as early as the 1800s, when binocular-like devices called stereoscopes tricked viewers’ brains into seeing images in 3D. More than a century later, that same stereoscopic technology would inspire an even more immersive experience — the aptly named Sensorama, which offered viewers an auditory, visual and even olfactory adventure: riding a motorcycle through Brooklyn.
But for years after, XR would largely remain the province of science fiction writers, computer scientists and stray visionaries, until finding its newborn-deer-like footing in the 1990s thanks to virtual reality arcade games like the one Dalton first experienced as a child. VR games with names like Dactyl Nightmare, Battlesphere and Total Destruction became expensive fixtures in arcades around the world, introducing a new generation of gamers to extended reality while laying the groundwork for what may become a new way to live, work and play in the metaverse.
Coined by Neal Stephenson in his 1992 novel Snow Crash, the term “metaverse” described a virtual refuge where inhabitants flocked to escape their dystopian physical reality. Thirty years later, the metaverse is still difficult to define and just generally “vague and complex,” as Wired writer Eric Ravenscraft described it.
Matthew Ball, who previously served as global head of strategy for Amazon Studios and is currently CEO of the venture fund Epyllion, describes the metaverse this way in his book, The Metaverse and How It Will Revolutionize Everything:
“A massively scaled and interoperable network of real-time rendered 3D virtual worlds that can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications and payments.”
But according to Dalton, “the simplest way to think about the metaverse is that it’s a collection of virtual worlds — inhabited by real people — that you are free to explore and interact with others in.”
However you decide to describe the metaverse, what’s clear is that the relationship between extended reality and the metaverse will continue to evolve. For many, VR headsets will be the preferred way to access the metaverse, but it likely won’t be the only way. Will AR glasses unlock the metaverse? What about contact lenses or some other combination of wearables? The future of XR will likely depend on how far we let this theoretical metaverse extend into our physical reality.
For years, XR would remain anchored in gaming and entertainment, only slowly creeping into broadcast sports, halftime shows and even the lived-literary zeitgeist.
But with Facebook’s $2 billion purchase of the VR technology company Oculus VR in 2014, and the social media company’s subsequent corporate transformation into Meta and corresponding bet on a speculative metaverse seven years later, the XR world has finally steadied itself within the collective consciousness — not only in gaming, but also in the wider corporate world where it’s helping workplaces become more efficient and collaborative.
“It’s not a future-facing concept,” Dalton said. “This value is being derived right now.”
Companies like Ford, DHL and Boeing have experienced just how effective XR technology can be.
According to Dalton, Ford has been using virtual reality since the early-to-mid-2000s to optimize production lines for safety, comfort and efficiency, building virtual production lines to test potential changes and their impact on workers.
“They’ll analyze the stresses on the human body as their ‘industrial athletes,’ as they’re called, lift a transmission over a gate,” Dalton said. “They’ll iterate and improve upon it, and then go back into virtual reality and test it again. Once it’s been optimized, they can then take the ready-to-go future production line and deploy it in the real world, thereby not needing to disrupt the original operations of the production line currently going on.”
The result? The injury rate among Ford’s more than 50,000 “industrial athletes” decreased by 70 percent, according to Dalton.
Shipping company DHL, too, increased order-picking efficiency by 15 percent using augmented reality, Dalton said, and Boeing has used AR to optimize aircraft production and reduce the time technicians spend on wiring. “If you think about the wiring that has to go on to airplanes, and the many, many kilometers of wiring that needs to be placed, it’s very complicated,” Dalton said. “But when you’ve got augmented reality glasses that can assist you, [you] make it a more efficient process to correctly implement that wiring.”
Seeing the value in XR isn’t necessarily a difficult barrier for business leaders to overcome, but successfully navigating and accessing the wide array of related software, hardware and content can prove overwhelming. That’s why Dynepic, an XR training company based in Reno, Nevada, provides clients an open infrastructure for accessing immersive technologies through its DX Platform, which acts as one centralized “XR-optimized” hub.
“The government can put a course together that has an augmented reality lesson from one vendor, a VR lesson from another vendor, they can have AI operate on top of it, and then put that together with their own content and quizzes,” Dynepic’s CEO Krissa Watry told Built In. “And they get all the data back in one spot.”
Watry, a veteran of the U.S. Air Force, sees the military and healthcare sectors pushing XR forward, at least financially, and she predicts corporate training and education will open up the market even more.
“I think training will be a big driver,” Watry said. “We’ve seen that in the military, right? Pilot training, maintenance training, air traffic control — you name it. And they’re all different. They’re like a little microcosm of society in a way. And they’re testing it out and getting back really good stats on the effectiveness of XR.”
Extended reality is poised to transform the office, too. Dalton believes one day when a new team member joins an organization, no matter the size, they’ll receive, along with a laptop and mobile phone, an XR headset.
Much like today, Dalton envisions a work life where we’re using laptops at our desks and smartphones on the go. What will change is what happens when we need to connect and collaborate with distant colleagues in a workshop or training session, and Zoom or Google Meet won’t cut it. That’s when we’ll pull on our VR headsets and really immerse ourselves in our work.
According to a report by the consulting firm Deloitte, global businesses are planning to expand XR capabilities, which analysts believe will improve workplace experiences through increased collaboration and more inclusive decision-making processes. Virtual field visits are one expected use case, as is prototype development and training.
Watry, whose company recently acquired social gaming company SurrealVR, sees multiplayer and collaborative training as the next step for corporate training.
This shift will likely accelerate as developers steeped in gaming move into training, creating XR content that’s more engaging, with improved graphics and incentivization fueling immersive learning experiences. “I think gamers who make that transition to addressing corporate training have a head start,” Watry said. “They already know how to make amazing things. They already understand game dynamics and things like gamification.”
While gaming will continue to play an important role in shaping the immediate future of XR, the work of researchers like Talis Reks at the MIT.nano Immersion Lab — a multidisciplinary XR service center and workshop for data visualization, VR and AR tool prototyping, and software and hardware development — is poised to expand the technology’s capabilities.
At the lab, Reks, a VR/AR gaming technologist, trains new users in advanced XR tools, like headsets that record heart rate and measure fluctuations in the size of a user’s pupils. While these tools exceed the capabilities of any product geared, or soon to be geared, to consumers, they’re necessary to really push the limits of XR hardware, Reks told Built In.
Naturally, the advances and innovations made in the lab could one day trickle down and enhance the XR experience for everyday users. One such project is a human-computer interface that could be used in compact AR glasses of the future. Developed by Jeehwan Kim, an associate professor in materials science and engineering at MIT, the interface features a gaze tracker based on pupil dilation and relies on microLED screen technology, which is more energy efficient and compatible with see-through displays. Kim’s interface also uses “electronic, skin-based, controller-free” motion tracking, which has the potential to improve how users interact with AR/VR, according to an MIT News report.
Another project pairs XR technology with biofeedback tools like electromyography devices and motion tracking for sports training analysis of professional fencers. “The VR and AR can be used as a tool to help these individuals learn what they’re doing, and see it for themselves,” Reks said.
But the immediate future of extended reality — and the metaverse — may be determined by Apple’s entry into XR, which has been described as a “threshold moment” by researchers.
“After all, if ever a company could solve the problem of how to design a piece of equipment that would make you want to put a contraption on your face that would allow you entry to another world while your body existed in this one, it would be Apple,” writes Vanessa Friedman for the New York Times .
Dalton predicts greater adoption of XR technology once Apple releases its mixed reality headset, which could happen sometime in 2023, with AR glasses following a few years later. Though Apple’s hardware will be marketed toward consumers, businesses will benefit from consumer adoption, Dalton said. Growing familiarity with XR technology, whether through at-home use of Apple’s headset, Meta’s Quest or some other yet-to-be-developed technology, will surely creep into the office, much like smartphones and laptops did.
“It becomes less of a stretch mentally for you as a business leader to invest in the technology if you’re seeing that it is mature in the market,” Dalton said.
Reks even imagines a day when people show up to work only to find that external monitors and computers have vanished. “Everything is just on a headset that’s very customizable and efficient,” Reks told Built In. “If we can find that middle ground of efficiency, immersion and keeping people intact with the real world, I think it would be very beneficial for a lot of different reasons.”
Finding a healthy balance between virtual and physical worlds will be one challenge as the shift to the metaverse continues, not unlike the one posed by smartphones and the eyes that became glued to them. “Now that we’ve seen it through other tools and technologies, it’s something to be aware of as this moves forward,” Reks said.
Many of the immediate, concrete challenges facing XR are, according to Dalton, related to cost, content and hardware, specifically headsets. “They need to be lighter, smaller, easier to put on and take off,” Dalton said. Many don’t accommodate headscarves or support eyeglasses, though new iterations are moving toward a more glasses-like form with dials to adjust for eyesight, Dalton added.
But more pressing is a general acceptance and understanding of XR technology and the potential value it could offer businesses — a hesitance that greater adoption in the consumer market could quell, Dalton said. For people to truly grasp just how game-changing XR can be, there’s really only one solution — pulling a headset down over your eyes and experiencing a new reality.